Aspect level sentiment classification model with location weight and long-short term memory based on attention-over-attention
WU Ting, CAO Chunping
Journal of Computer Applications    2019, 39 (8): 2198-2203.   DOI: 10.11772/j.issn.1001-9081.2018122565
Abstract
Traditional attention-based neural network models cannot effectively attend to aspect features and sentiment information, and context words at different distances from, or in different directions relative to, an aspect word contribute differently to the assessment of its sentiment polarity. To address these problems, a Location Weight and Attention-Over-Attention Long Short-Term Memory (LWAOA-LSTM) model was proposed. Firstly, location weight information was added to the word vectors. Then, Long Short-Term Memory (LSTM) networks were used to model aspects and sentences simultaneously, producing aspect and sentence representations; these representations were learned jointly through an attention-over-attention module to capture the interactions from the aspect to the text and from the text to the aspect, so that the important parts of the sentence were attended to automatically. Finally, experiments were carried out on thematic datasets covering attractions, catering and accommodation to verify the accuracy of aspect-level sentiment analysis with the model. Experimental results show that the accuracy of the model on the attractions, catering and accommodation datasets is 78.3%, 80.6% and 82.1% respectively, and that LWAOA-LSTM outperforms traditional LSTM network models.
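The abstract describes the pipeline only at a high level, so the following is a minimal PyTorch sketch of how a location-weighted attention-over-attention LSTM of this kind could be wired together. The embedding and hidden dimensions, the specific location-weight formula, and all layer and argument names (e.g. `LWAOALSTM`, `asp_pos`) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the LWAOA-LSTM idea from the abstract (assumed details, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class LWAOALSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=150, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Separate bidirectional LSTMs model the sentence and the aspect term.
        self.sent_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.asp_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    @staticmethod
    def location_weights(sent_len, asp_pos):
        # Assumed weighting: words closer to the aspect term contribute more.
        idx = torch.arange(sent_len, dtype=torch.float)
        return 1.0 - (idx - asp_pos).abs() / sent_len            # (sent_len,)

    def forward(self, sent_ids, asp_ids, asp_pos):
        # sent_ids: (B, n), asp_ids: (B, m), asp_pos: (B,) index of the aspect in the sentence.
        sent_emb = self.embed(sent_ids)                          # (B, n, E)
        # Step 1: add location weight information to the sentence word vectors.
        weights = torch.stack(
            [self.location_weights(sent_ids.size(1), p) for p in asp_pos]
        ).to(sent_emb.device)
        sent_emb = sent_emb * weights.unsqueeze(-1)              # (B, n, E)

        # Step 2: LSTM encoders produce sentence and aspect representations.
        h_sent, _ = self.sent_lstm(sent_emb)                     # (B, n, 2H)
        h_asp, _ = self.asp_lstm(self.embed(asp_ids))            # (B, m, 2H)

        # Step 3: attention-over-attention on the pairwise interaction matrix.
        interact = torch.bmm(h_sent, h_asp.transpose(1, 2))      # (B, n, m)
        alpha = F.softmax(interact, dim=1)                       # text-to-aspect attention
        beta = F.softmax(interact, dim=2)                        # aspect-to-text attention
        beta_avg = beta.mean(dim=1, keepdim=True)                # (B, 1, m) averaged over sentence words
        gamma = torch.bmm(alpha, beta_avg.transpose(1, 2))       # (B, n, 1) final word-level weights

        # Step 4: weighted sentence representation, then sentiment classification.
        rep = torch.bmm(h_sent.transpose(1, 2), gamma).squeeze(-1)   # (B, 2H)
        return self.fc(rep)                                      # sentiment logits
```

The two softmax directions give attention from the text to the aspect and from the aspect to the text; multiplying them yields the final per-word weights, matching the bidirectional interaction the abstract attributes to the attention-over-attention module.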